60 research outputs found
Diffusion, localization and dispersion relations on ``small-world'' lattices
The spectral properties of the Laplacian operator on ``small-world''
lattices, that is, mixtures of one-dimensional chains and random graph
structures, are investigated numerically and analytically. A transfer matrix
formalism including a self-consistent potential à la Edwards is introduced. In
the extended region of the spectrum, an effective medium calculation provides
the density of states and pseudo-dispersion relations for the eigenmodes in
close agreement with the simulations. Localization effects, which are due to
connectivity fluctuations of the sites, are shown to be quantitatively described
by the single defect approximation recently introduced for random graphs.
Comment: 17 revtex pages, 16 eps figures + 2 tables
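The spectral calculation above is analytical; as a purely illustrative cross-check, the Laplacian spectrum of a small-world lattice can also be obtained by direct numerical diagonalization. The graph construction and sizes below are toy assumptions for a sketch, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_shortcuts = 200, 40          # ring of N sites plus random shortcut edges

# Adjacency of a one-dimensional ring (each site linked to its two neighbours)
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1

# Superimpose random long-range edges, giving a small-world structure
added = 0
while added < n_shortcuts:
    i, j = rng.integers(0, N, size=2)
    if i != j and A[i, j] == 0:
        A[i, j] = A[j, i] = 1
        added += 1

# Graph Laplacian L = D - A; its eigenvalues give the density of states
L = np.diag(A.sum(axis=1)) - A
eigvals = np.linalg.eigvalsh(L)   # sorted ascending; lowest is 0 (connected)
```

A histogram of `eigvals` approximates the density of states that the effective medium calculation predicts in the extended region.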
Heuristic average-case analysis of the backtrack resolution of random 3-Satisfiability instances
An analysis of the average-case complexity of solving random 3-Satisfiability
(SAT) instances with backtrack algorithms is presented. We first interpret
previous rigorous works in a unifying framework based on the statistical
physics notions of dynamical trajectories, phase diagram and growth process. It
is argued that, under the action of the Davis--Putnam--Loveland--Logemann
(DPLL) algorithm, 3-SAT instances are turned into 2+p-SAT instances whose
characteristic parameters (ratio alpha of clauses per variable, fraction p of
3-clauses) can be followed during the operation, and define resolution
trajectories. Depending on the location of trajectories in the phase diagram of
the 2+p-SAT model, easy (polynomial) or hard (exponential) resolutions are
generated. Three regimes are identified, depending on the ratio alpha of the
3-SAT instance to be solved. Lower sat phase: for small ratios, DPLL almost
surely finds a solution in a time growing linearly with the number N of
variables. Upper sat phase: for intermediate ratios, instances are almost
surely satisfiable, but finding a solution requires exponential time (2^(N
omega) with omega > 0) with high probability. Unsat phase: for large ratios,
there is almost always no solution and proofs of refutation are exponential. An
analysis of the growth of the search tree in both upper sat and unsat regimes
is presented and allows us to estimate omega as a function of alpha. This
analysis is based on an exact relationship between the average size of the
search tree and the powers of the evolution operator encoding the elementary
steps of the search heuristic.
Comment: to appear in Theoretical Computer Science
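The resolution trajectories described above can be illustrated with a toy experiment: descend one branch of simplified DPLL-style propagation on a random 3-SAT instance and record the residual instance's (alpha, p) coordinates. The heuristic (satisfy unit literals, otherwise set a random free variable to true) and the sizes are illustrative assumptions, not the paper's analysis:

```python
import random

random.seed(1)
N, alpha0 = 150, 2.0                       # small ratio: lower sat phase
M = int(alpha0 * N)

# Random 3-SAT: each clause is a set of signed literals (+v or -v)
clauses = [frozenset(random.choice([v, -v])
                     for v in random.sample(range(1, N + 1), 3))
           for _ in range(M)]

def simplify(cls, lit):
    """Assign lit = True: drop satisfied clauses, shorten the others."""
    return [c - {-lit} for c in cls if lit not in c]

# Descend one branch, recording the trajectory (alpha, p) of the
# residual 2+p-SAT instance as clauses are satisfied or shortened.
trajectory = []
free = set(range(1, N + 1))
cls, ok = clauses, True
while cls and ok:
    n3 = sum(len(c) == 3 for c in cls)
    trajectory.append((len(cls) / max(len(free), 1), n3 / len(cls)))
    unit = next((c for c in cls if len(c) == 1), None)
    lit = next(iter(unit)) if unit else random.choice(sorted(free))
    free.discard(abs(lit))
    cls = simplify(cls, lit)
    ok = all(len(c) > 0 for c in cls)      # empty clause = contradiction
```

The trajectory starts at (alpha0, p = 1) and drifts as 3-clauses are turned into 2-clauses, mirroring the 3-SAT to 2+p-SAT conversion described above.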
Theoretical study of collective modes in DNA at ambient temperature
The instantaneous normal modes corresponding to base pair vibrations (radial
modes) and twist angle fluctuations (angular modes) of a DNA molecule model at
ambient temperature are theoretically investigated. Due to thermal disorder,
normal modes are not plane waves with a single wave number q but have a finite
and frequency-dependent damping width. The density of modes rho(nu), the
average dispersion relation nu(q) as well as the coherence length xi(nu) are
analytically calculated. The Gibbs averaged resolvent is computed using a
replicated transfer matrix formalism and variational wave functions for the
ground and first excited state. Our results for the density of modes are
compared to Raman spectroscopy measurements of the collective modes for DNA in
solution and show good agreement with experimental data in the low-frequency
regime nu < 150 cm^{-1}. Radial modes extend over frequencies ranging from 50
cm^{-1} to 110 cm^{-1}. Angular modes, related to helical axis vibrations, are
limited to nu < 25 cm^{-1}. Normal modes are highly disordered and coherent
over a few base pairs only (xi < 2 nm), in good agreement with neutron
scattering experiments.
Comment: 20 pages + 13 ps figures
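As a hedged toy analogue of the normal-mode calculation above, one can diagonalize the Hessian of a disordered harmonic chain and use the inverse participation ratio as a proxy for mode localization. The spring and pinning disorder below are invented for illustration and are not the paper's DNA model:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 300                                   # sites in the toy chain

# Toy Hessian: harmonic chain with disordered couplings and on-site
# pinning, standing in for thermally disordered instantaneous modes
k = 1.0 + 0.5 * rng.random(N - 1)         # random nearest-neighbour springs
d = 0.2 * rng.random(N)                   # random on-site pinning
H = np.diag(d)
for i in range(N - 1):
    H[i, i] += k[i]
    H[i + 1, i + 1] += k[i]
    H[i, i + 1] = H[i + 1, i] = -k[i]

w2, modes = np.linalg.eigh(H)             # squared frequencies, eigenvectors
freqs = np.sqrt(np.clip(w2, 0, None))

# Inverse participation ratio: a large IPR means the mode lives on few
# sites, i.e. a short coherence length
ipr = (modes ** 4).sum(axis=0)
```

Plotting `ipr` against `freqs` shows which part of the spectrum is coherent over many sites and which is localized by the disorder.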
Fast Inference of Interactions in Assemblies of Stochastic Integrate-and-Fire Neurons from Spike Recordings
We present two Bayesian procedures to infer the interactions and external
currents in an assembly of stochastic integrate-and-fire neurons from the
recording of their spiking activity. The first procedure is based on the exact
calculation of the most likely time courses of the neuron membrane potentials
conditioned by the recorded spikes, and is exact for a vanishing noise variance
and for an instantaneous synaptic integration. The second procedure takes into
account the presence of fluctuations around the most likely time courses of the
potentials, and can deal with moderate noise levels. The running time of both
procedures is proportional to the number S of spikes multiplied by the squared
number N of neurons. The algorithms are validated on synthetic data generated
by networks with known couplings and currents. We also reanalyze previously
published recordings of the activity of the salamander retina (comprising 32
to 40 neurons and 65,000 to 170,000 spikes). We study the dependence
of the inferred interactions on the membrane leaking time; the differences and
similarities with the classical cross-correlation analysis are discussed.
Comment: Accepted for publication in J. Comput. Neurosci. (Dec 2010)
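The classical cross-correlation baseline mentioned above can be sketched on synthetic data: simulate a small stochastic integrate-and-fire network with known couplings and compute lagged spike cross-correlations. All parameters and the coupling matrix are illustrative assumptions; this is not the paper's Bayesian inference procedure:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T, dt = 5, 20000, 0.1                 # neurons, time steps, step size

# Toy coupling matrix and external currents (invented for this sketch)
J = 0.4 * (rng.random((N, N)) < 0.3) * rng.choice([-1.0, 1.0], (N, N))
np.fill_diagonal(J, 0.0)
I_ext = 0.8 + 0.1 * rng.random(N)

V = np.zeros(N)
spikes = np.zeros((T, N), dtype=bool)
tau, V_th = 10.0, 1.0
for t in range(T):
    # leaky integration with white noise and instantaneous synapses
    V += dt * (-V / tau + I_ext) + 0.2 * np.sqrt(dt) * rng.standard_normal(N)
    if t > 0:
        V += J @ spikes[t - 1]
    fired = V >= V_th
    spikes[t] = fired
    V[fired] = 0.0                       # reset after a spike

# Classical baseline: one-step lagged cross-correlation of spike trains
s = spikes.astype(float)
s -= s.mean(axis=0)
C = s[:-1].T @ s[1:] / (T - 1)           # C[i, j]: neuron i leads neuron j
```

Comparing the sign pattern of `C` with the true `J` gives a feel for what cross-correlation analysis does and does not recover.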
The Entropy of the K-Satisfiability Problem
The threshold behaviour of the K-Satisfiability problem is studied in the
framework of the statistical mechanics of random diluted systems. We find that
at the transition the entropy is finite and hence that the transition itself is
due to the abrupt appearance of logical contradictions in all solutions and not
to the progressive decrease of the number of these solutions down to zero. A
physical interpretation is given for the different cases.
Comment: revtex, 11 pages + 1 figure
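The notion of a finite entropy of solutions can be illustrated by brute force on a small instance: enumerate all assignments of a random 3-SAT formula near the threshold ratio and compute log(#solutions)/N. The sizes are toy assumptions; the paper's analysis is statistical-mechanical, not enumerative:

```python
import itertools
import math
import random

random.seed(4)
K, N, alpha = 3, 14, 4.2                 # ratio near the 3-SAT threshold
M = int(alpha * N)

# Random K-SAT formula: a clause is a list of (variable, negated?) pairs
clauses = [[(v, random.random() < 0.5)
            for v in random.sample(range(N), K)]
           for _ in range(M)]

def satisfies(assign, cls):
    """True if every clause has at least one satisfied literal."""
    return all(any(assign[v] != neg for v, neg in c) for c in cls)

# Exhaustive count of solutions; entropy per variable = log(count) / N
n_solutions = sum(satisfies(a, clauses)
                  for a in itertools.product([False, True], repeat=N))
entropy = math.log(n_solutions) / N if n_solutions else 0.0
```

Repeating this over many small instances around the threshold shows the entropy staying bounded away from the naive "vanishing number of solutions" picture, within strong finite-size effects.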
Learning and generalization theories of large committee machines
The study of the distribution of volumes associated with the internal
representations of learning examples allows us to derive the critical learning
capacity of large committee machines, to verify the stability of the solution
in the limit of a large number of hidden units, and to find a Bayesian
generalization cross-over.
Comment: 14 pages, revtex
Analysis of the computational complexity of solving random satisfiability problems using branch and bound search algorithms
The computational complexity of solving random 3-Satisfiability (3-SAT)
problems is investigated. 3-SAT is a representative example of hard
computational tasks; it consists in deciding whether a set of alpha N randomly
drawn logical constraints involving N Boolean variables can be satisfied
altogether or not. Widely used solving procedures, such as the
Davis-Putnam-Loveland-Logemann (DPLL) algorithm, perform a systematic search for
a solution, through a sequence of trials and errors represented by a search
tree. In the present study, we identify, using theory and numerical
experiments, easy (size of the search tree scaling polynomially with N) and
hard (exponential scaling) regimes as a function of the ratio alpha of
constraints per variable. The typical complexity is explicitly calculated in
the different regimes, in very good agreement with numerical simulations. Our
theoretical approach is based on the analysis of the growth of the branches in
the search tree under the operation of DPLL. On each branch, the initial 3-SAT
problem is dynamically turned into a more generic 2+p-SAT problem, where p and
1-p are the fractions of constraints involving three and two variables
respectively. The growth of each branch is monitored by the dynamical evolution
of alpha and p and is represented by a trajectory in the static phase diagram
of the random 2+p-SAT problem. Depending on whether or not the trajectories
cross the boundary between phases, single branches or full trees are generated
by DPLL, resulting in easy or hard resolutions.
Comment: 37 RevTeX pages, 15 figures; submitted to Phys. Rev.
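A minimal DPLL sketch makes the search-tree picture concrete: unit propagation plus branching on a literal from a shortest clause, counting visited nodes as a proxy for resolution cost. The branching heuristic and instance sizes are illustrative choices, not the paper's:

```python
import random

random.seed(5)

def random_3sat(n, alpha):
    """Random 3-SAT instance with int(alpha * n) clauses over n variables."""
    return [frozenset(random.choice([v, -v])
                      for v in random.sample(range(1, n + 1), 3))
            for _ in range(int(alpha * n))]

def dpll(clauses, nodes=None):
    """Return (satisfiable?, number of search-tree nodes visited)."""
    if nodes is None:
        nodes = [0]
    nodes[0] += 1
    if not clauses:
        return True, nodes[0]            # all clauses satisfied
    if any(len(c) == 0 for c in clauses):
        return False, nodes[0]           # empty clause: contradiction
    unit = next((c for c in clauses if len(c) == 1), None)
    if unit:
        lits = [next(iter(unit))]        # forced assignment, no branching
    else:
        lit = next(iter(min(clauses, key=len)))
        lits = [lit, -lit]               # branch: try both truth values
    for lit in lits:
        reduced = [c - {-lit} for c in clauses if lit not in c]
        sat, _ = dpll(reduced, nodes)
        if sat:
            return True, nodes[0]
    return False, nodes[0]

easy_sat, easy_nodes = dpll(random_3sat(40, 2.0))   # easy (sat) regime
hard_unsat, hard_nodes = dpll(random_3sat(40, 6.0)) # hard (unsat) regime
```

Sweeping alpha and averaging the node counts over many instances reproduces, in miniature, the easy/hard regimes discussed above.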
Reconstructing a Random Potential from its Random Walks
The problem of how many trajectories of a random walker in a potential are
needed to reconstruct the values of this potential is studied. We show that
this problem can be solved by calculating the probability of survival of an
abstract random walker in a partially absorbing potential. The approach is
illustrated on the discrete Sinai (random force) model with a drift. We
determine the parameter values (temperature, duration of each trajectory, ...)
that make reconstruction as fast as possible.
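A hedged one-dimensional sketch of the idea: let a Metropolis walker explore a random-force (Sinai-like) potential with drift, then invert the Boltzmann occupation statistics to recover the potential up to an additive constant. The dynamics and parameters are toy assumptions, not the paper's partially absorbing walker construction:

```python
import math
import random

random.seed(6)
L, T_temp = 20, 1.0                       # lattice sites, temperature

# Discrete Sinai-like potential: sum of random +/-1 forces plus a drift
forces = [random.choice([-1.0, 1.0]) + 0.3 for _ in range(L - 1)]
V = [0.0]
for f in forces:
    V.append(V[-1] + f)

# Metropolis walker; long trajectories sample the Boltzmann weight exp(-V/T)
counts = [0] * L
x = 0
for _ in range(200000):
    y = max(0, min(L - 1, x + random.choice([-1, 1])))
    if random.random() < math.exp(-(V[y] - V[x]) / T_temp):
        x = y
    counts[x] += 1

# Reconstruct the potential (up to a constant) from site occupations
total = sum(counts)
V_rec = [-T_temp * math.log(c / total) if c else float("inf")
         for c in counts]
```

Sites the walker never visits stay unreconstructed (infinite estimate), which is exactly the trade-off the trajectory-number question above addresses.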
Innovation rather than improvement: a solvable high-dimensional model highlights the limitations of scalar fitness
Much of our understanding of ecological and evolutionary mechanisms derives
from analysis of low-dimensional models: with few interacting species, or few
axes defining "fitness". It is not always clear to what extent the intuition
derived from low-dimensional models applies to the complex, high-dimensional
reality. For instance, most naturally occurring microbial communities are
strikingly diverse, harboring a large number of coexisting species, each of
which contributes to shaping the environment of others. Understanding the
eco-evolutionary interplay in these systems is an important challenge, and an
exciting new domain for statistical physics. Recent work identified a promising
new platform for investigating highly diverse ecosystems, based on the classic
resource competition model of MacArthur. Here, we describe how the same
analytical framework can be used to study evolutionary questions. Our analysis
illustrates how, at high dimension, the intuition promoted by a one-dimensional
(scalar) notion of fitness can become misleading. Specifically, while the
low-dimensional picture emphasizes organism cost or efficiency, we exhibit a
regime where cost becomes irrelevant for survival, and link this observation to
generic properties of high-dimensional geometry.
Comment: 8 pages, 4 figures + Supplementary Material
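A toy numerical version of MacArthur-style resource competition conveys the high-dimensional setting: many species with random consumption strategies and costs, resources depleting with total harvest. The dynamics and parameters below are illustrative assumptions, not the paper's analytical framework:

```python
import numpy as np

rng = np.random.default_rng(7)
S, R = 30, 20                     # many species competing for many resources

# Random consumption strategies, resource supplies, and maintenance costs
C = rng.random((S, R)) * (rng.random((S, R)) < 0.5)   # sparse strategies
supply = 1.0 + rng.random(R)
cost = C.sum(axis=1) * (0.9 + 0.2 * rng.random(S))    # near-"fair" costs

# Simple MacArthur-type dynamics: growth = resource harvest minus cost,
# with resource availability depleted by the total community harvest
n = np.full(S, 0.1)
dt = 0.01
for _ in range(20000):
    h = supply / (1.0 + C.T @ n)
    growth = C @ h - cost
    n += dt * n * growth
    n = np.clip(n, 0.0, None)

survivors = int((n > 1e-6).sum())
```

Tracking how `survivors` depends on the cost distribution is one way to probe, numerically, the regime where scalar cost stops predicting survival.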